Learning Unambiguous Reduced Sequence Descriptions
Authors
Abstract
You want your neural net algorithm to learn sequences? Do not just use conventional gradient descent (or approximations thereof) in recurrent nets, time-delay nets, etc. Instead, use your sequence learning algorithm to implement the following method: No matter what your final goals are, train a network to predict its next input from the previous ones. Since only unpredictable inputs convey new information, ignore all predictable inputs but let all unexpected inputs (plus information about the time step at which they occurred) become inputs to a higher-level network of the same kind (working on a slower, self-adjusting time scale). Go on building a hierarchy of such networks. This principle reduces the descriptions of event sequences without loss of information, thus easing supervised or reinforcement learning tasks. Experiments show that systems based on this principle can require less computation per time step and many fewer training sequences than conventional training algorithms for recurrent nets. I also discuss a method involving only two recurrent networks which tries to collapse a multi-level predictor hierarchy into a single recurrent net.
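To make the data flow of this reduction principle concrete, the following is a minimal sketch, not the paper's actual architecture: the names Predictor and compress are hypothetical, and a simple successor-frequency model stands in for the recurrent predictor networks. The only point illustrated is that predictable inputs are dropped while unexpected inputs, together with their time steps, form the shorter sequence handed to the next level of the hierarchy.

# Minimal sketch of the history-compression idea described above.
# Hypothetical stand-ins: Predictor (a toy successor-frequency model,
# not a recurrent net) and compress (the reduction step).

from collections import defaultdict


class Predictor:
    """Toy next-symbol predictor: remembers the most frequent successor."""

    def __init__(self):
        self.counts = defaultdict(lambda: defaultdict(int))

    def predict(self, prev):
        successors = self.counts.get(prev)
        if not successors:
            return None  # no prediction available yet
        return max(successors, key=successors.get)

    def update(self, prev, nxt):
        self.counts[prev][nxt] += 1


def compress(sequence, predictor):
    """Return the reduced description: (time step, symbol) pairs the
    predictor failed to predict. Predictable symbols are dropped; given
    the predictor's update rule they can be reconstructed, so no
    information is lost."""
    reduced, prev = [], None
    for t, symbol in enumerate(sequence):
        if predictor.predict(prev) != symbol:   # unexpected input
            reduced.append((t, symbol))         # keep it plus its time step
        predictor.update(prev, symbol)          # keep learning online
        prev = symbol
    return reduced


if __name__ == "__main__":
    seq = list("abababababcabababab")
    level1 = Predictor()
    reduced1 = compress(seq, level1)
    print("level-1 surprises:", reduced1)

    # A higher-level predictor of the same kind works on the shorter,
    # self-adjusting time scale defined by the surprises of level 1.
    level2 = Predictor()
    reduced2 = compress([s for _, s in reduced1], level2)
    print("level-2 surprises:", reduced2)

Running the sketch, the repetitive "ab" pattern is quickly predicted away and only the first few symbols plus the unexpected "c" survive as the reduced description passed to level 2, which mirrors the shortened sequences the higher-level networks in the paper operate on.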
Similar resources
PP-Attachment Disambiguation Boosted by a Gigantic Volume of Unambiguous Examples
We present a PP-attachment disambiguation method based on a gigantic volume of unambiguous examples extracted from raw corpus. The unambiguous examples are utilized to acquire precise lexical preferences for PP-attachment disambiguation. Attachment decisions are made by a machine learning method that optimizes the use of the lexical preferences. Our experiments indicate that the precise lexical...
PAC-Learning Unambiguous NTS Languages
Non-terminally separated (NTS) languages are a subclass of deterministic context free languages where there is a stable relationship between the substrings of the language and the non-terminals of the grammar. We show that when the distribution of samples is generated by a PCFG, based on the same grammar as the target language, the class of unambiguous NTS languages is PAC-learnable from positi...
Unconscious perception: attention, awareness, and control.
Conscious perception is substantially overestimated when standard measurement techniques are used. That overestimation has contributed to the controversial nature of studies of unconscious perception. A process-dissociation procedure (L. L. Jacoby, 1991) was used for separately estimating the contribution of conscious and unconscious perception to performance of a stem-completion task. Unambigu...
The Performance of Evolutionary Artificial Neural Networks in Unambiguous and Ambiguous Learning Situations
Syntactic ambiguity resolution in discourse: modeling the effects of referential context and lexical frequency.
Sentences with temporarily ambiguous reduced relative clauses (e.g., The actress selected by the director believed that...) were preceded by discourse contexts biasing a main clause or a relative clause. Eye movements in the disambiguating region (by the director) revealed that, in the relative clause biasing contexts, ambiguous reduced relatives were no more difficult to process than unambiguo...
Journal:
Volume, Issue:
Pages:
Publication date: 1992